
    Joint Aerial-Terrestrial Resource Management in UAV-Aided Mobile Radio Networks

    This article addresses the issue of joint aerial-terrestrial resource management in mobile radio networks supported by a UAV operating as a network node, and discusses the potential of true integration between the terrestrial and UAV components of the network. A simulation campaign shows that, by properly optimizing the system parameters related to the UAV flight, a single UAV can bring a significant improvement in network throughput over a wide service area. A joint radio resource management approach, in which the UAV and the terrestrial base stations operate in a coordinated manner, brings significant advantages compared with alternative algorithms.
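
    The abstract does not spell out the coordination rule, but one minimal way to express a joint aerial-terrestrial assignment is to let each user pick the cell (UAV or terrestrial) with the best load-discounted link quality, so the UAV offloads congested cells. The sketch below is purely illustrative; all names and the metric itself are assumptions, not the paper's algorithm.

        # Illustrative sketch: assign each user to the UAV cell or a terrestrial
        # cell by comparing a load-discounted SNR, so the UAV absorbs hotspots.
        def joint_cell_assignment(users, terrestrial_cells, uav_cell, snr):
            """snr[(user, cell)] -> linear SNR estimate; returns {user: cell}."""
            cells = list(terrestrial_cells) + [uav_cell]
            load = {cell: 0 for cell in cells}
            assignment = {}
            for user in users:
                # Penalize already-loaded cells to balance traffic across the network.
                best = max(cells, key=lambda c: snr[(user, c)] / (1 + load[c]))
                assignment[user] = best
                load[best] += 1
            return assignment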

    Cross-Layer Design of an Energy-Efficient Cluster Formation Algorithm with Carrier-Sensing Multiple Access for Wireless Sensor Networks

    A new energy-efficient scheme for data transmission in a wireless sensor network (WSN) is proposed, having in mind a typical application including a sink, which periodically triggers the WSN, and nodes uniformly distributed over a specified area. Routing, medium access control (MAC), physical, energy, and propagation aspects are jointly taken into account through simulation; however, the protocol design is based on analytical considerations reported in the appendix. Information routing is based on a clustered self-organized structure; a carrier-sensing multiple access (CSMA) protocol is chosen at the MAC layer. Two different scenarios are examined, characterized by different channel fading rates. Four versions of our protocol are presented, tailored to the two scenarios; two of them implement a cross-layer (CL) approach, where MAC parameters influence both the network and physical layers. Performance is measured in terms of network lifetime (related to energy efficiency) and packet loss rate (related to network availability). The paper discusses the rationale behind the selection of MAC protocols for WSNs and provides a complete model characterization spanning from the network layer to the propagation channel. The advantages of the CL approach with respect to an algorithm belonging to the well-known class of low-energy adaptive clustering hierarchy (LEACH) protocols are shown.
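
    Since the benchmark belongs to the LEACH family, a compact reference point is the classical LEACH-style cluster-head election, here scaled by residual energy as one possible cross-layer flavour. This is a minimal sketch under those assumptions, not the protocol proposed in the paper; parameter names are illustrative.

        import random

        # LEACH-style cluster-head election; the election threshold is scaled by
        # residual energy (an assumed cross-layer tweak, not the paper's design).
        def elect_cluster_heads(nodes, p=0.05, round_idx=0):
            """nodes: dict node_id -> residual energy in [0, 1]; returns set of heads."""
            heads = set()
            for node_id, energy in nodes.items():
                # Classical LEACH threshold, rotating the head role across rounds.
                threshold = p / (1 - p * (round_idx % int(1 / p)))
                if random.random() < threshold * energy:
                    heads.add(node_id)
            return heads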

    Decentralized detection in IEEE 802.15.4 wireless sensor networks

    We present a mathematical model to study decentralized detection in clustered wireless sensor networks (WSNs). Sensors and fusion centers (FCs) are distributed with the aim of detecting an event of interest. Sensors are organized in clusters, with FCs acting as cluster heads, and are supposed to observe the same common binary phenomenon. A query-based application is accounted for; FCs periodically send queries and wait for replies coming from sensors. After reception of data, FCs perform data fusion with a majority-like fusion rule and send their decisions to an access point (AP), where a final data fusion is carried out and an estimate of the phenomenon is obtained. We assume that sensors are IEEE 802.15.4-compliant devices and use the medium access control (MAC) protocol defined by the standard, based on carrier-sense multiple access with collision avoidance. Decentralized detection and MAC issues are jointly investigated through analytical modelling. The proposed framework allows the derivation of the probability of decision error at the AP, when accounting for packet losses due to possible collisions. Our results show that MAC losses strongly affect system performance. The impact of different clustering configurations and of noisy communications is also investigated.
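
    The two-stage fusion chain described above (sensors to FCs, FCs to AP) can be summarized in a few lines. In this sketch, MAC collisions are abstracted as an i.i.d. packet-drop probability, which is an assumption made only for illustration; the paper derives the error probability analytically rather than by this simplified simulation.

        import random

        # Majority-like fusion at the FCs followed by a final majority fusion at the AP.
        def majority(bits):
            """Majority bit of a non-empty list of binary decisions (ties -> 1)."""
            return int(2 * sum(bits) >= len(bits))

        def ap_decision(clusters, loss_prob=0.0):
            """clusters: one list of sensor decisions (0/1) per FC; returns the AP estimate."""
            fc_decisions = []
            for replies in clusters:
                # Drop each reply independently to mimic MAC losses (assumption).
                received = [b for b in replies if random.random() >= loss_prob]
                fc_decisions.append(majority(received) if received else 0)
            return majority(fc_decisions)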

    Neighbors-Aware Proportional Fair scheduling for future wireless networks with mixed MAC protocols

    In this paper, we consider a beyond-5G scenario where two types of users, denoted as scheduled and uncoordinated nodes, coexist on the same set of radio resources for sending data to a base station. Scheduled nodes rely solely on a centralized scheduler within the base station for the assignment of resources, while uncoordinated nodes use an unslotted Carrier Sense Multiple Access (CSMA) protocol for channel access. We propose and evaluate through simulations: (a) a novel centralized resource scheduling algorithm, called Neighbors-Aware Proportional Fair (N-PF), and (b) a novel packet length adaptation algorithm, called Channel-Aware (CA) Packet Length Adaptation, for the scheduled nodes. The N-PF algorithm considers the uplink channel state conditions and the number of uncoordinated nodes neighboring each scheduled node in the aggregate scheduling metric, in order to maximize the packet transmission success probability. The CA algorithm provides an additional degree of freedom for improving the performance, since scheduled nodes with a lower number of hidden terminals, i.e., a higher packet capture probability, are assigned longer packet transmission opportunities. We consider two benchmark schemes: the Proportional Fair (PF) algorithm, as a resource scheduling algorithm, and a discrete uniform distribution (DUD) scheme for packet lengths. Simulation results show that the proposed schemes yield significant gains in network goodput, without compromising fairness, with respect to the two benchmark solutions taken from the literature.
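
    One way to picture a neighbors-aware proportional fair metric is to discount the usual PF ratio by the number of uncoordinated neighbors of each scheduled node. The exact combining rule below is an assumption for illustration only, not the formula used by N-PF in the paper.

        # Sketch of a neighbors-aware PF scheduler: pick the node maximizing the PF
        # ratio (instantaneous / average rate) discounted by its uncoordinated neighbors.
        def npf_schedule(nodes):
            """nodes: dict id -> (inst_rate, avg_rate, n_uncoordinated_neighbors)."""
            def metric(item):
                _, (r_inst, r_avg, n_nb) = item
                return (r_inst / max(r_avg, 1e-9)) / (1 + n_nb)
            return max(nodes.items(), key=metric)[0]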

    Distributed Resource Allocation for URLLC in IIoT Scenarios: A Multi-Armed Bandit Approach

    This paper addresses the problem of enabling inter-machine Ultra-Reliable Low-Latency Communication (URLLC) in future 6G Industrial Internet of Things (IIoT) networks. As far as the Radio Access Network (RAN) is concerned, centralized pre-configured resource allocation requires scheduling grants to be disseminated to the User Equipments (UEs) before uplink transmissions, which is not efficient for URLLC, especially in the case of flexible or unpredictable traffic. To alleviate this burden, we study a distributed, user-centric scheme based on machine learning, in which UEs autonomously select their uplink radio resources without the need to wait for scheduling grants or preconfiguration of connections. Using simulation, we demonstrate that a Multi-Armed Bandit (MAB) approach represents a desirable solution to allocate resources with URLLC in mind in an IIoT environment, for both periodic and aperiodic traffic, even considering highly populated networks and aggressive traffic. Comment: 2022 IEEE Globecom Workshops (GC Wkshps): Future of Wireless Access and Sensing for Industrial IoT (FutureIIoT).
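
    As a concrete picture of autonomous resource selection by a UE, a standard UCB1 bandit over the available uplink resources looks as follows. The reward model (1 if the transmission succeeded, 0 otherwise) and the choice of UCB1 are assumptions for this sketch; the paper may use a different bandit formulation.

        import math

        # UCB1-style bandit: each UE learns which uplink resource to pick on its own.
        class ResourceBandit:
            def __init__(self, n_resources):
                self.counts = [0] * n_resources      # times each resource was tried
                self.values = [0.0] * n_resources    # running mean reward per resource

            def select(self):
                t = sum(self.counts) + 1
                for r, c in enumerate(self.counts):
                    if c == 0:
                        return r                     # try every resource once first
                ucb = [v + math.sqrt(2 * math.log(t) / c)
                       for v, c in zip(self.values, self.counts)]
                return max(range(len(ucb)), key=ucb.__getitem__)

            def update(self, r, reward):
                # reward: 1.0 if the uplink transmission succeeded, else 0.0 (assumption).
                self.counts[r] += 1
                self.values[r] += (reward - self.values[r]) / self.counts[r]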

    Cellular network capacity and coverage enhancement with MDT data and Deep Reinforcement Learning

    Recent years have witnessed a remarkable increase in the availability of data and computing resources in communication networks. This has contributed to the rise of data-driven over model-driven algorithms for network automation. This paper investigates a Minimization of Drive Tests (MDT)-driven Deep Reinforcement Learning (DRL) algorithm to optimize coverage and capacity by tuning antenna tilts on a cluster of cells from TIM's cellular network. We jointly utilize MDT data, electromagnetic simulations, and network Key Performance Indicators (KPIs) to define a simulated network environment for the training of a Deep Q-Network (DQN) agent. Some tweaks have been introduced to the classical DQN formulation to improve the agent's sample efficiency, stability, and performance. In particular, a custom exploration policy is designed to introduce soft constraints at training time. Results show that the proposed algorithm outperforms baseline approaches such as DQN and best-first search in terms of long-term reward and sample efficiency. Our results indicate that MDT-driven approaches constitute a valuable tool for autonomous coverage and capacity optimization of mobile radio networks.
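
    The abstract mentions an exploration policy that encodes soft constraints at training time. A minimal way to realize that idea is an epsilon-greedy rule whose random draws favor tilt actions inside a preferred range; the range, weights, and function names below are assumptions, not the paper's policy.

        import random

        # Epsilon-greedy exploration with soft constraints on candidate antenna tilts:
        # out-of-range tilts are still explorable, just with a lower probability.
        def explore_or_exploit(q_values, tilts, epsilon=0.1, soft_range=(2.0, 10.0)):
            """q_values: list of Q(s, a); tilts: candidate tilt (degrees) per action."""
            if random.random() < epsilon:
                lo, hi = soft_range
                weights = [1.0 if lo <= t <= hi else 0.2 for t in tilts]
                return random.choices(range(len(tilts)), weights=weights)[0]
            return max(range(len(q_values)), key=q_values.__getitem__)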

    Smart city pilot projects using LoRa and IEEE 802.15.4 technologies

    Information and Communication Technologies (ICTs), through wireless communications and the Internet of Things (IoT) paradigm, are the enabling keys for transforming traditional cities into smart cities, since they provide the core infrastructure behind public utilities and services. However, to be effective, IoT-based services may require different technologies and network topologies, even when addressing the same urban scenario. In this paper, we highlight this aspect and present two smart city testbeds developed in Italy. The first concerns a smart infrastructure for public lighting and relies on a heterogeneous network using the IEEE 802.15.4 short-range communication technology, whereas the second addresses smart-building applications and is based on the LoRa low-rate, long-range communication technology. The smart lighting scenario is discussed by providing the technical details and the economic benefits of a large-scale (around 3000 light poles), flexible, and modular implementation of a public lighting infrastructure, while the smart-building testbed is investigated, through measurement campaigns and simulations, by assessing the coverage and the performance of the LoRa technology in a real urban scenario. Results show that a proper parameter setting is needed to cover large urban areas while keeping the airtime sufficiently low to maintain packet losses at satisfactory levels.
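
    The trade-off between coverage and airtime can be made concrete with the standard LoRa time-on-air formula: higher spreading factors extend range but inflate airtime and, with it, the collision probability. The sketch below follows the well-known Semtech time-on-air calculation; the default parameter values are typical EU868 settings assumed for illustration, not taken from the testbed.

        import math

        # Time on air for a LoRa uplink (CRC enabled), per the standard formula.
        def lora_airtime(payload_bytes, sf=7, bw=125e3, cr=1, preamble=8,
                         explicit_header=True, low_data_rate_opt=False):
            t_sym = (2 ** sf) / bw                     # symbol duration [s]
            de = 1 if low_data_rate_opt else 0
            ih = 0 if explicit_header else 1
            n_payload = 8 + max(
                math.ceil((8 * payload_bytes - 4 * sf + 28 + 16 - 20 * ih)
                          / (4 * (sf - 2 * de))) * (cr + 4), 0)
            return (preamble + 4.25) * t_sym + n_payload * t_sym

        # Example: a 20-byte packet at SF7 / 125 kHz stays on air roughly 57 ms,
        # while the same packet at SF12 takes over a second.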